AIbase

# bfloat16 optimization

GPT-2 774M FineWeb 150B
License: MIT
This model originates from karpathy's llm.c project and was converted to Hugging Face format for research on bfloat16 performance; its training run consumed 150 billion tokens.
Large Language Model · Transformers
Author: rhysjones
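Since the entry above concerns bfloat16 performance research, it may help to see concretely what bfloat16 precision keeps and discards: bfloat16 is simply the upper 16 bits of an IEEE float32 (same 8-bit exponent, only 7 mantissa bits). The sketch below is a minimal emulation of that conversion with round-to-nearest-even, using NumPy purely as an assumed tool; the page itself names no specific library for this.

```python
import numpy as np

def to_bfloat16(x: np.ndarray) -> np.ndarray:
    """Round float32 values to bfloat16 precision, returned as float32
    for easy inspection.

    bfloat16 keeps the float32 sign and 8-bit exponent but only the top
    7 mantissa bits, so the conversion amounts to dropping the lower
    16 bits of the float32 bit pattern with round-to-nearest-even.
    """
    bits = x.astype(np.float32).view(np.uint32)
    # Add half of the dropped range (0x7FFF) plus the parity of the
    # lowest kept bit, which implements round-to-nearest-even.
    rounded = bits + np.uint32(0x7FFF) + ((bits >> np.uint32(16)) & np.uint32(1))
    # Zero the lower 16 bits and reinterpret as float32.
    return (rounded & np.uint32(0xFFFF0000)).view(np.float32)

vals = np.array([1.0, 1.0009765625, 3.14159265], dtype=np.float32)
print(to_bfloat16(vals))  # 1 + 2^-10 collapses to 1.0; pi becomes 3.140625
```

When loading the Hugging Face checkpoint itself, the same precision is typically requested by passing `torch_dtype=torch.bfloat16` to `from_pretrained` rather than by converting values manually.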